
    Adaptive Bayesian Estimation Applied to the Localization of Mobile Users

    The popularity of satellite positioning systems in open spaces has generated strong demand for systems that can replace them in complex environments, where they fail. However, the characteristics of the wireless propagation channel in these environments are dynamic and unpredictable. Moreover, there are situations in which no operational wireless infrastructure is available. In this doctoral thesis, we present a theoretical framework and algorithms for data fusion in localization systems deployed in complex environments. The techniques presented use adaptive models to accommodate the changing conditions of the propagation channel. They fuse prior information with measurements of time of arrival, received power, force, and angular velocity in a manner that is optimal from a Bayesian point of view. Both empirical and simulation results show a substantial improvement over conventional approaches, achieving an error close to the Cramér-Rao bound. Departamento de Teoría de la Señal y Comunicaciones e Ingeniería Telemática.
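The thesis abstract above does not specify its algorithms, but the core idea of Bayesian fusion of a prior with a range measurement can be illustrated with a minimal grid-based update in Python. The 1-D grid, the anchor position, and the noise value are illustrative assumptions, not details from the thesis.

```python
import math

def gaussian(x, mu, sigma):
    """Unnormalised Gaussian density, used as the measurement likelihood."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

def bayes_update(prior, grid, measured_range, anchor, sigma):
    """Fuse a time-of-arrival range measurement with a prior over a 1-D grid
    of candidate positions, returning the normalised posterior."""
    posterior = [p * gaussian(abs(x - anchor), measured_range, sigma)
                 for p, x in zip(prior, grid)]
    total = sum(posterior)
    return [p / total for p in posterior]

# Uniform prior over positions 0..10 m; an anchor at 0 m reports a 4 m range.
grid = [i * 0.5 for i in range(21)]
prior = [1.0 / len(grid)] * len(grid)
post = bayes_update(prior, grid, measured_range=4.0, anchor=0.0, sigma=0.5)
best = grid[post.index(max(post))]  # maximum a posteriori position estimate
```

In a real system this update would be repeated over time and combined with an adaptive channel model, as the abstract describes.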

    A Robust Multi-Sensor PHD Filter Based on Multi-Sensor Measurement Clustering

    [EN] This letter presents a novel multi-sensor probability hypothesis density (PHD) filter for multi-target tracking by means of multiple or even massive sensors that are linked by a fusion center or by a peer-to-peer network. A key challenge is that little is known about the statistical properties of the sensors in terms of their measurement noise, clutter, target detection probability, and even potential cross-correlation. Our approach converts the collection of measurements from different sensors into a set of proxy, homologous measurements. These synthetic measurements overcome the problems of false and missing data and of unknown statistics, and facilitate linear PHD updating that amounts to standard PHD filtering with no false or missing data. Simulations demonstrate the advantages and limitations of our approach in comparison with cutting-edge multi-sensor/distributed PHD filters.
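The clustering step that converts raw multi-sensor measurements into proxy measurements can be sketched in Python. This is a simplified 1-D illustration under assumed parameters (gap-based clustering, neighbour averaging, a minimum cluster size to reject clutter), not the letter's actual method.

```python
def cluster_measurements(measurements, gap=1.0):
    """Group sorted 1-D measurements from many sensors: a new cluster starts
    whenever the distance to the previous measurement exceeds `gap`."""
    ms = sorted(measurements)
    clusters = [[ms[0]]]
    for m in ms[1:]:
        if m - clusters[-1][-1] <= gap:
            clusters[-1].append(m)
        else:
            clusters.append([m])
    return clusters

def proxy_measurements(clusters, min_size=2):
    """Average each cluster seen by at least `min_size` sensors; smaller
    clusters are treated as clutter and discarded."""
    return [sum(c) / len(c) for c in clusters if len(c) >= min_size]

# Three sensors observe two targets near 2.0 and 7.0, plus one clutter point.
z = [1.9, 2.1, 2.0, 6.8, 7.2, 11.5]
proxies = proxy_measurements(cluster_measurements(z))
```

The resulting proxy set is then fed to a single standard PHD update, which is the simplification the abstract highlights.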

    Context-aided Inertial Navigation via Belief Condensation

    Inertial navigation systems suffer from drift errors that degrade their performance. The main existing techniques to mitigate this impairment are based on the detection of stance phases under the specific situational context of pedestrian walking with a foot-mounted inertial measurement unit (IMU). Existing approaches achieve acceptable performance only under simple circumstances, such as smooth movements and short periods of time. In addition, they lack a principled, unifying methodology for exploiting contextual information. In this paper, we establish a general framework for context-aided inertial navigation and present efficient algorithms for its implementation based on the inference technique called belief condensation (BC). We evaluate the proposed techniques against the state of the art through the experimental case study of pedestrian walking with a foot-mounted IMU. Our results show that the proposed techniques can remarkably improve navigation accuracy while maintaining moderate complexity.
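The stance-phase detection that the abstract identifies as the main existing drift-mitigation cue can be sketched in Python: a foot is flagged as stationary when the local variance of the accelerometer norm falls below a threshold. The window length and threshold are illustrative assumptions, not values from the paper.

```python
import statistics

def detect_stance(accel_norms, window=5, threshold=0.05):
    """Flag samples whose local acceleration-norm variance is below a
    threshold, a common cue that a foot-mounted IMU is stationary and a
    zero-velocity update can be applied to cancel drift."""
    flags = []
    half = window // 2
    for i in range(len(accel_norms)):
        lo, hi = max(0, i - half), min(len(accel_norms), i + half + 1)
        flags.append(statistics.pvariance(accel_norms[lo:hi]) < threshold)
    return flags

# Five near-gravity samples (foot planted) followed by a swing phase.
stance = [9.80, 9.81, 9.82, 9.81, 9.80]
swing = [8.0, 11.5, 7.2, 12.0, 9.0]
flags = detect_stance(stance + swing)
```

A context-aided framework such as the one proposed would treat this stance cue as just one source of contextual information rather than the sole correction mechanism.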

    Fault-Tolerant Temperature Control Algorithm for IoT Networks in Smart Buildings

    [EN] The monitoring of Internet of things networks depends to a great extent on the availability and correct functioning of all the network nodes that collect data. All of these nodes must correctly fulfil their purpose to ensure efficient, high-quality monitoring and control of the network. This paper focuses on the problem of fault-tolerant maintenance of a networked environment in the domain of the internet of things. Based on continuous-time Markov chains, together with a cooperative control algorithm, a novel feedback model-based predictive hybrid control algorithm is proposed to improve the maintenance and reliability of the internet of things network. Virtual sensors are substituted for the sensors that the algorithm predicts will not function properly in future time intervals; this allows reliable monitoring and control of the network to be maintained. In this way, the internet of things network improves its robustness, since our fault-tolerant control algorithm finds the malfunctioning nodes that are collecting incorrect data and self-corrects the issue by replacing malfunctioning sensors with new ones. In addition, the proposed model is capable of optimising sensor positioning. As a result, data collection from the environment can be kept stable. The developed continuous-time control model is applied to guarantee reliable monitoring and control of temperature in a smart supermarket. Finally, the efficiency of the presented approach is verified with the results obtained in the conducted case study.
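The predictive fallback the abstract describes can be illustrated with a minimal two-state continuous-time Markov chain in Python: a sensor that is working now fails with rate λ, so its probability of failure within a horizon t is 1 − e^(−λt), and a virtual sensor (here, a neighbour average) takes over when that risk is too high. The rates, threshold, and neighbour-average virtual sensor are illustrative assumptions, not the paper's model.

```python
import math

def failure_probability(failure_rate, horizon):
    """Two-state continuous-time Markov chain (working -> failed): the
    probability that a sensor working now has failed by `horizon` time
    units, given exponential failure rate `failure_rate`."""
    return 1.0 - math.exp(-failure_rate * horizon)

def select_reading(sensor_value, neighbour_values, failure_rate, horizon,
                   risk_threshold=0.5):
    """Use the physical sensor while its predicted failure risk is low;
    otherwise fall back to a virtual sensor built from its neighbours."""
    if failure_probability(failure_rate, horizon) < risk_threshold:
        return sensor_value
    return sum(neighbour_values) / len(neighbour_values)
```

For example, a reliable sensor (low λ) keeps reporting its own temperature, while a sensor predicted to fail within the horizon is silently replaced by the average of its neighbours, which is what keeps monitoring stable in the paper's scenario.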

    Evaluation of points of improvement in NGS data analysis

    [EN] DNA sequencing is a fundamental technique in molecular biology that allows the exact sequence of nucleotides in a DNA sample to be read. Over the past decades, DNA sequencing has seen significant advances, evolving from manual, laborious techniques to modern high-throughput techniques. Despite these advances, the interpretation and analysis of sequencing data continue to present challenges. Artificial Intelligence (AI), and in particular machine learning, has emerged as an essential tool to address these challenges. The application of AI in the sequencing pipeline refers to the use of algorithms and models to automate, optimize, and improve the precision of the sequencing process and its subsequent analysis. The Sanger sequencing method, introduced in the 1970s, was one of the first to be widely used. Although effective, this method is slow and is not suitable for sequencing large amounts of DNA, such as entire genomes. With the arrival of next-generation sequencing (NGS) in the 21st century, greater speed and efficiency in obtaining genomic data have been achieved. However, the exponential increase in the amount of data produced has created a bottleneck in its analysis and interpretation.

    Application of hybrid algorithms and Explainable Artificial Intelligence in genomic sequencing

    [EN] DNA sequencing is one of the fields that has advanced the most in recent years within clinical genetics and human biology. However, the large amount of data generated through next-generation sequencing (NGS) techniques requires advanced data analysis processes that are sometimes complex and beyond the capabilities of clinical staff. Therefore, this work aims to shed light on the possibilities of applying hybrid algorithms and explainable artificial intelligence (XAI) to data obtained through NGS. The suitability of each architecture will be evaluated phase by phase in order to offer final recommendations that allow implementation in the clinical sequencing workflow.

    Deep Symbolic Learning Architecture for Variant Calling in NGS

    [EN] The variant detection process (variant calling) is fundamental in bioinformatics, demanding maximum precision and reliability. This study examines an innovative integration strategy between a traditional pipeline developed in-house and an advanced Intelligent System (IS). Although the original pipeline already had tools based on traditional algorithms, it had limitations, particularly in the detection of rare or unknown variants. Therefore, the IS was introduced with the aim of providing an additional layer of analysis, capitalizing on deep and symbolic learning techniques to improve and enhance previous detections. The main technical challenge lay in interoperability. To overcome this, NextFlow, a scripting language designed to manage complex bioinformatics workflows, was employed. Through NextFlow, communication and efficient data transfer between the original pipeline and the IS were facilitated, thus guaranteeing compatibility and reproducibility. After the variant calling process of the original system, the results were transmitted to the IS, where a meticulous sequence of analyses was implemented, from preprocessing to data fusion. As a result, an optimized set of variants was generated and integrated with the previous results. Variants corroborated by both tools were considered to be of high reliability, while discrepancies indicated areas for detailed investigation. The product of this integration advanced to subsequent stages of the pipeline, usually annotation or interpretation, contextualizing the variants from biological and clinical perspectives. This adaptation not only maintained the original functionality of the pipeline, but also enhanced it with the IS, establishing a new standard in the variant calling process.
This research offers a robust and efficient model for the detection and analysis of genomic variants, highlighting the promise and applicability of blended learning in bioinformatics. This study has been funded by the AIR Genomics project (file number CCTT3/20/SA/0003), through the 2020 call for R&D PROJECTS ORIENTED TO THE EXCELLENCE AND COMPETITIVE IMPROVEMENT OF THE CCTT by the Institute of Business Competitiveness of Castilla y León and FEDER funds.
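The consensus step described above, where variants corroborated by both tools are treated as high-reliability and discrepancies are flagged for review, can be sketched in Python. The variant-string format and function names here are illustrative assumptions, not the study's actual data model.

```python
def merge_variant_calls(pipeline_calls, is_calls):
    """Combine variant calls from the traditional pipeline and the
    intelligent system (IS): agreement between both tools yields
    high-confidence variants, disagreement flags variants for review."""
    pipeline_calls, is_calls = set(pipeline_calls), set(is_calls)
    return {
        "high_confidence": sorted(pipeline_calls & is_calls),
        "needs_review": sorted(pipeline_calls ^ is_calls),
    }

# Hypothetical calls: both tools agree on one variant and disagree on two.
calls = merge_variant_calls(
    ["chr1:100A>T", "chr2:250G>C"],
    ["chr1:100A>T", "chr7:3100C>T"],
)
```

In the integrated pipeline, the high-confidence set would proceed to annotation and interpretation, while the review set would be examined in detail, matching the workflow the abstract outlines.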